4,712 research outputs found

    The Prevalence and Control of Bacillus and Related Spore-Forming Bacteria in the Dairy Industry

    Peer-reviewed. Milk is sterile as produced in the udder but, due to its high nutrient content, it can be a good growth substrate for contaminating bacteria. The quality of milk is monitored via somatic cell counts and total bacterial counts, with prescribed regulatory limits to ensure quality and safety. Bacterial contaminants can cause disease or spoilage of milk and its secondary products. Aerobic spore-forming bacteria, such as those from the genera Sporosarcina, Paenisporosarcina, Brevibacillus, Paenibacillus, Geobacillus and Bacillus, are a particular concern in this regard, as they are able to survive industrial pasteurization and form biofilms within pipes and stainless steel equipment. These single- or multiple-species biofilms become a reservoir of spoilage microorganisms, and a cycle of contamination can be initiated. Indeed, previous studies have highlighted that these microorganisms are highly prevalent in dead ends, corners, cracks, crevices, gaskets, valves and the joints of stainless steel equipment used in dairy manufacturing plants. Hence, adequate monitoring and control measures are essential to prevent spoilage and ensure consumer safety. Common control approaches include specific cleaning-in-place processes, chemical and biological biocides, and other novel methods. In this review, we highlight the problems caused by these microorganisms and discuss issues relating to their prevalence, monitoring and control in the dairy industry. NG is funded by the Teagasc Walsh Fellowship Scheme and through the Irish Dairy Levy funded project ‘Thermodur-Out’.

    Token Coherence: A New Framework for Shared-Memory Multiprocessors

    Commercial workload and technology trends are pushing existing shared-memory multiprocessor coherence protocols in divergent directions. Token Coherence provides a framework for new coherence protocols that can reconcile these opposing trends.

    Why On-Chip Cache Coherence is Here to Stay

    Today’s multicore chips commonly implement shared memory with cache coherence as low-level support for operating systems and application software. Technology trends continue to enable the scaling of the number of (processor) cores per chip. Because conventional wisdom says that coherence does not scale well to many cores, some prognosticators predict the end of coherence. This paper refutes this conventional wisdom by showing one way to scale on-chip cache coherence with bounded costs by combining known techniques such as shared caches augmented to track cached copies, explicit cache eviction notifications, and hierarchical design. Based upon our scalability analysis of this proof-of-concept design, we predict that on-chip coherence, and the programming convenience and compatibility it provides, are here to stay.
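    Two of the techniques the abstract names — a shared cache that tracks which cores hold each line, and explicit cache eviction notifications — can be illustrated with a toy directory model. This is a hypothetical Python sketch for intuition only, not the paper's actual design: the class and method names are invented, and real hardware would use tag-embedded bit vectors rather than a dictionary.

    ```python
    # Toy sketch of an inclusive shared cache directory that tracks sharers
    # per line, so invalidation traffic scales with the number of actual
    # copies rather than with the total core count.

    class InclusiveDirectory:
        def __init__(self, num_cores):
            self.num_cores = num_cores
            self.sharers = {}  # line address -> set of core ids holding a copy

        def load(self, core, addr):
            # A read miss adds the requesting core to the line's sharer set.
            self.sharers.setdefault(addr, set()).add(core)

        def store(self, core, addr):
            # A write invalidates only the tracked sharers (not all cores),
            # then records the writer as the sole holder.
            invalidated = self.sharers.get(addr, set()) - {core}
            self.sharers[addr] = {core}
            return invalidated  # cores that must drop their copy

        def evict(self, core, addr):
            # Explicit eviction notification: without it, the directory would
            # conservatively keep stale sharers and send needless invalidations.
            holders = self.sharers.get(addr, set())
            holders.discard(core)
            if not holders:
                self.sharers.pop(addr, None)


    d = InclusiveDirectory(num_cores=4)
    d.load(0, 0x100)
    d.load(1, 0x100)
    d.evict(1, 0x100)          # core 1 notifies the directory it dropped the line
    print(d.store(2, 0x100))   # only core 0 still needs invalidating -> {0}
    ```

    The point of the eviction notification is visible in the last line: because core 1 announced its eviction, the write by core 2 invalidates one sharer instead of two, keeping coherence traffic bounded by the true sharing degree.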

    Accelerating Science: A Computing Research Agenda

    The emergence of "big data" offers unprecedented opportunities not only for accelerating scientific advances but also for enabling new modes of discovery. Scientific progress in many disciplines is increasingly enabled by our ability to examine natural phenomena through the computational lens, i.e., using algorithmic or information-processing abstractions of the underlying processes, and by our ability to acquire, share, integrate and analyze disparate types of data. However, there is a huge gap between our ability to acquire, store, and process data and our ability to make effective use of the data to advance discovery. Despite successful automation of routine aspects of data management and analytics, most elements of the scientific process currently require considerable human expertise and effort. Accelerating science to keep pace with the rate of data acquisition and data processing calls for the development of algorithmic or information-processing abstractions, coupled with formal methods and tools for modeling and simulation of natural processes, as well as major innovations in cognitive tools for scientists, i.e., computational tools that leverage and extend the reach of human intellect and partner with humans on a broad range of tasks in scientific discovery (e.g., identifying, prioritizing and formulating questions; designing, prioritizing and executing experiments designed to answer a chosen question; drawing inferences and evaluating the results; and formulating new questions, in a closed-loop fashion). This calls for a concerted research agenda aimed at: development, analysis, integration, sharing, and simulation of algorithmic or information-processing abstractions of natural processes, coupled with formal methods and tools for their analysis and simulation; and innovations in cognitive tools that augment and extend human intellect and partner with humans in all aspects of science.
    Comment: Computing Community Consortium (CCC) white paper, 17 pages.